Concentration and moment inequalities for polynomials of independent random variables

Authors

  • Warren Schudy
  • Maxim Sviridenko
Abstract

In this work we design a general method for proving moment inequalities for polynomials of independent random variables. Our method works for a wide range of random variables, including Gaussian, Boolean, exponential, Poisson and many others. We apply our method to derive general concentration inequalities for polynomials of independent random variables. We show that our method implies concentration inequalities for some previously open problems, e.g., the permanent of random symmetric matrices. We show that our concentration inequality is stronger than the well-known concentration inequality due to Kim and Vu [31]. The main advantages of our method over existing ones are the wide range of random variables it can handle and the bounds it yields in previously intractable regimes of high-degree polynomials and small expectations. On the negative side, we show that even for Boolean random variables each term in our concentration inequality is tight.
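
For orientation, the degree-one case of such bounds is the classical Bernstein inequality for sums of bounded independent random variables; the display below is that standard baseline (not the paper's theorem), which results of this type extend to higher-degree polynomials. If X_1, ..., X_n are independent with |X_i - E[X_i]| <= M almost surely and sigma^2 = sum_i Var(X_i), then

\[ \Pr\!\Big( \Big| \sum_{i=1}^{n} (X_i - \mathbb{E} X_i) \Big| \ge t \Big) \le 2 \exp\!\Big( - \frac{t^2}{2\,(\sigma^2 + M t / 3)} \Big) \qquad \text{for all } t \ge 0. \]

Kim–Vu-type inequalities control the tail of a degree-d polynomial in terms of expected partial derivatives of all orders; the abstract's comparison refers to bounds of this kind.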

Similar articles

Moment inequalities for functions of independent random variables

A general method for obtaining moment inequalities for functions of independent random variables is presented. It is a generalization of the entropy method which has been used to derive concentration inequalities for such functions [7], and is based on a generalized tensorization inequality due to Latała and Oleszkiewicz [25]. The new inequalities prove to be a versatile tool in a wide range o...
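
As a reference point for the entropy/tensorization approach described above, the variance (second-moment) case is the classical Efron–Stein inequality; the statement below is that standard baseline, not the paper's generalization. For Z = f(X_1, ..., X_n) with independent X_i, and Z_i' = f(X_1, ..., X_{i-1}, X_i', X_{i+1}, ..., X_n) with X_i' an independent copy of X_i,

\[ \operatorname{Var}(Z) \le \frac{1}{2} \sum_{i=1}^{n} \mathbb{E}\big[ (Z - Z_i')^2 \big]. \]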

Moment Inequalities for Functions of Independent Random Variables, by Stéphane Boucheron, Olivier Bousquet, ...

A general method for obtaining moment inequalities for functions of independent random variables is presented. It is a generalization of the entropy method which has been used to derive concentration inequalities for such functions [Boucheron, Lugosi and Massart Ann. Probab. 31 (2003) 1583–1614], and is based on a generalized tensorization inequality due to Latała and Oleszkiewicz [Lecture Note...
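
Schematically, and only up to an absolute constant C (a rough reading of this line of work, not the paper's exact statement), the moment inequalities referred to here bound the upper moments of Z - E[Z] by moments of an Efron–Stein-type variance proxy:

\[ \big\| (Z - \mathbb{E} Z)_+ \big\|_q \le C \sqrt{ q \, \| V_+ \|_{q/2} } \quad (q \ge 2), \qquad V_+ = \mathbb{E}'\Big[ \sum_{i=1}^{n} (Z - Z_i')_+^2 \Big], \]

where \mathbb{E}' denotes expectation over the independent copies X_i' only.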

Bernstein-like Concentration and Moment Inequalities for Polynomials of Independent Random Variables: Multilinear Case

Polynomials of independent random variables arise in a variety of fields such as Machine Learning, Analysis of Boolean Functions, Additive Combinatorics, Random Graph Theory, Stochastic Partial Differential Equations, etc. They naturally model the expected value of the objective function (or of the left-hand side of constraints) in randomized rounding algorithms for non-linear optimization problems where ...
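
For concreteness, a degree-d multilinear polynomial of independent variables X_1, ..., X_n, of the kind this abstract has in mind, can be written (in illustrative notation, not taken from the paper) as

\[ f(X_1, \dots, X_n) = \sum_{S \subseteq [n],\, |S| \le d} w_S \prod_{i \in S} X_i, \]

and, since the X_i are independent, \mathbb{E} f = \sum_S w_S \prod_{i \in S} \mathbb{E} X_i, which is how such polynomials arise from randomized rounding.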

Some Probabilistic Inequalities for Fuzzy Random Variables

In this paper, the concepts of positive dependence and linearly positive quadrant dependence are introduced for fuzzy random variables. Also, an inequality is obtained for partial sums of linearly positive quadrant dependent fuzzy random variables. Moreover, a weak law of large numbers is established for linearly positive quadrant dependent fuzzy random variables. We extend some well known inequ...
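
For ordinary real-valued random variables, the classical notion behind these definitions (fuzzy random variables require an adapted version, which is what the paper introduces) is Lehmann's positive quadrant dependence: X and Y are positively quadrant dependent if

\[ \Pr(X \le x,\ Y \le y) \ \ge\ \Pr(X \le x)\,\Pr(Y \le y) \qquad \text{for all } x, y \in \mathbb{R}. \]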

Some Probability Inequalities for Quadratic Forms of Negatively Dependent Subgaussian Random Variables

In this paper, we obtain upper exponential bounds for the tail probabilities of quadratic forms of negatively dependent subgaussian random variables. In particular, the law of the iterated logarithm for quadratic forms of independent subgaussian random variables is generalized to the case of negatively dependent subgaussian random variables.
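
In the independent case, the standard benchmark for such tail bounds is a Hanson–Wright-type inequality; a schematic version (up to absolute constants, for independent mean-zero \sigma-subgaussian coordinates, not the negatively dependent extension studied here) reads

\[ \Pr\big( | X^{\top} A X - \mathbb{E}\, X^{\top} A X | > t \big) \le 2 \exp\!\Big( - c \min\!\Big( \frac{t^2}{\sigma^4 \|A\|_F^2},\ \frac{t}{\sigma^2 \|A\|} \Big) \Big), \]

where \|A\|_F and \|A\| are the Frobenius and operator norms of the coefficient matrix A, and c > 0 is an absolute constant.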

Publication year: 2012